Joosten, T. (2005). Exploring the Potential and Assessment Impact of Student Response Systems: A Student Survey. University of Wisconsin-Milwaukee, Learning Technology Center. Retrieved from:
http://mysurveys.wikispaces.com/Clickers




Abstract for EXEMPT Protocol
Student Response Systems (SRS): Exploring Potential and Assessing Impact

The purpose of this study is to evaluate the impact of student response systems (SRS) on teaching and learning. Student response systems, more commonly called clickers, are wireless communication systems that allow faculty to poll students, who respond using a clicker or response pad. In our study, we will utilize both quantitative and qualitative methodologies in evaluating the project outcomes.

In this study, data will be collected from approximately 2500 students and 11 faculty members. The 11 faculty members have integrated student response systems into their courses for the fall 2005 semester. The students will have participated in the courses and used the student response system technology. Students who participate in this study will be provided with an information sheet about the study. They can withdraw from this study while filling out the questionnaire for any reason. Filling out the questionnaire will take approximately 15 minutes. All data will be treated confidentially. No raw data will be shared. Grouped or "aggregate" data may be shared in class or published in professional journals.




First, we will be organizing various faculty focus groups in order to gather faculty reports on the SRS for analysis. Also, a thematic analysis of narratives will be conducted using individual comments and stories gathered during one-on-one training appointments and from email correspondence and faculty queries during the project. For selected faculty participants, we will conduct a case study focusing on two courses conducted in parallel, with one lecture utilizing a student response system and the other offered in a traditional setting without SRS. All faculty will remain anonymous in any written reports or publications that result from the study. In addition, we will administer a faculty survey containing a series of questions intended to measure instructors’ attitudes toward the SRS. The data collected will be anonymous. Surveys will be sent to faculty through campus mail and returned to me using campus mail. No identifying information is required on the faculty surveys.

Next, we will gather anonymous student evaluations by survey, which will contain a series of questions that ask about students' previous technology use, attitudes toward the use of SRS in the classroom, their evaluation of the course itself, their evaluation of the clicker technology itself, their perception of their learning in relation to SRS, and their performance in the course in relation to SRS. Questionnaires will be distributed to 2500 students who have participated in a course taught by one of the 11 faculty members who are using student response systems in their classrooms. Each faculty member will be given a packet with the number of questionnaires corresponding to the number of students he or she has taught and asked to distribute the questionnaires to the students. Each instructor’s packet will be coded with a letter so that questionnaires from the same classroom can be grouped for analysis. Surveys will be collected by the faculty member and sent via intercampus mail to the researcher at the Learning Technology Center. We will conduct various multivariate analyses of the data to produce quantifiable results of the effectiveness of the technology.

In addition to the data gathered through questionnaires, faculty will gather performance data, such as average grades and pass rates from previous semesters without SRS, to run comparisons, as well as gather learning assessments in class. Also, faculty will gather student evaluations from current semesters and from previous semesters prior to SRS to run comparisons.
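To make the planned comparisons concrete, the sketch below shows how grade averages and pass rates from an SRS semester and a prior non-SRS semester might be compared in Python. This is a minimal sketch: the numbers are hypothetical, and the choice of an independent-samples t-test and a chi-square test is an assumption, since the protocol does not name specific tests.

    from scipy import stats

    # Hypothetical final grade averages (percent) for students in a
    # non-SRS semester and in the fall 2005 SRS semester of the same course.
    grades_no_srs = [71.2, 68.5, 80.1, 74.3, 66.9, 77.8, 72.4]
    grades_srs = [75.6, 79.2, 83.0, 70.1, 81.4, 78.9, 76.3]

    # Independent-samples t-test comparing grade averages across semesters.
    t_stat, p_value = stats.ttest_ind(grades_srs, grades_no_srs)
    print(f"grades: t = {t_stat:.2f}, p = {p_value:.3f}")

    # Pass rates compared at the course level: counts of students who
    # passed and did not pass in each semester (hypothetical numbers),
    # tested with a chi-square test on the 2x2 contingency table.
    table = [[88, 12],   # SRS semester: passed, did not pass
             [79, 21]]   # non-SRS semester: passed, did not pass
    chi2, p, dof, expected = stats.chi2_contingency(table)
    print(f"pass rates: chi2 = {chi2:.2f}, p = {p:.3f}")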

Survey data will be kept in a secure desk drawer in my office until the data is entered into a database. All surveys will be destroyed at the completion of the study.



This study is considered exempt under Category 1 because it involves research conducted in a traditional education setting involving normal education practices and the effectiveness of instructional techniques. Also, this study is exempt under Category 2 because it involves the use of survey procedures and interview procedures. Finally, this study is exempt because it involves research in which we will collect or study existing data, documents, and records.



Information Sheet

Student Response Systems

I am Tanya Joosten of the Learning Technology Center at the University of Wisconsin-Milwaukee. We are conducting a study of how student response systems (a.k.a. clickers) impact teaching and student learning. We would appreciate your participation in this study, as it will assist us in making recommendations for improving the teaching of students throughout the UWM campus.



If you agree to be in this study, you will be asked to fill out a survey and to answer several questions about your experience in using clickers. The survey should take 15 minutes to complete. There are over 2000 students at UWM participating in the study. This study will take place between September 2005 and January 2006. There are no known risks associated with your being in the study. Possible benefits are that you will have a voice in helping shape the instruction that students on the UWM campus receive.



The questionnaire you fill out will be treated confidentially. PLEASE DO NOT PUT YOUR NAME ANYWHERE ON THE SURVEY. There will be no way to link you to your responses. Data from this study will be shared with my classmates and teacher and may be published in professional journals. Only grouped data will be presented or published.



You do not have to be in the study. You can withdraw from this study while filling out the questionnaire for any reason. There is no penalty for withdrawing.



Once the study is completed, we would be glad to give the results to you. In the meantime, if you have any questions, please contact me:



Tanya Joosten
Learning Technology Center
University of Wisconsin-Milwaukee

PO Box 604

Milwaukee, WI 53201

tjoosten@uwm.edu
414.229.4319

If you have any complaints about your experience as a participant in this study, please call or write:



Chris Buth Furness

IRB Administrator

Institutional Review Board for the

Protection of Human Subjects

The Graduate School

University of Wisconsin-Milwaukee

PO Box 340, MIT 206

Milwaukee, WI 53201

414-229-3173 phone; 414-229-5000 fax

chrisb@uwm.edu



Although Ms. Furness will ask your name, all complaints are kept in confidence.



This research project has been approved by the University of Wisconsin-Milwaukee Institutional Review Board for the Protection of Human Subjects for a one-year period.



STUDENT SURVEY

Filling out this survey indicates that I am at least 18 years old and that I am giving my informed consent to be a subject in this study.





Course Number, Course Name, Date, Time and Location:


Student ID

Demographics

Please indicate with an “X” which of the following best describes you.

Age:
17 - 19 _
20 - 22 _
23 - 25 _
25+ _

Gender:
Male _
Female _

Race:
African American _
Latino _
Asian _
European American/Caucasian _
Other: Please identify

Marital Status:
Single _
Married _
Divorced/Separated _

Student Status:
Full-time _
Part-time _

Education Level:
Freshman _
Sophomore _
Junior _
Senior _

Have you previously taken a course that used clickers? Yes (1) No (2)

Did you use a clicker in this course? Yes (1) No (2)

Which clicker did you use? Turning Point Response Pad (1) CPS/eInstruction (2) Other (3)


Computer Use

Please rate your use of computers or computer software. There are no correct answers or preferred answers. Each item is rated on a 5-point scale ranging from frequently (5) to never (1).

(5) Frequently | (4) Often | (3) Sometimes | (2) Rarely | (1) Never

1. _ How often do you use a computer at home?
2. _ How often do you use a computer at school?
3. _ How often do you use a computer at work?
4. _ How often do you use a computer in a library?
5. _ How often do you use a computer at a friend or family member’s house?
6. _ How often do you use a computer?
7. _ How often do you use the Internet?
8. _ How often do you send or receive electronic mail?
9. _ How often do you use chat software/instant messenger (AOL, MSN, ICQ, etc.)?
10. _ How often do you use a word processor (Word, WordPerfect)?
11. _ How often do you use spreadsheets (Excel, Lotus)?
12. _ How often do you use the computer for games?
13. _ How often do you use the computer to view or edit graphics/photo images?
14. _ How often do you use a courseware product (e.g., Desire2Learn, D2L)?


Student Response System Grant Evaluation: Student Survey Items

Note: The items will be randomized in the survey. The reverse-coded items are noted by (r) and are required to increase the reliability of this instrument. The headings in bold are specific to the variable they are measuring. Some of the survey items are taken from surveys with proven reliability; the rest are grounded in the research examined. Once the survey is administered, I will run a factor analysis to make sure the items factor to their corresponding variable. Multiple regression will be run to analyze relationships between variables.
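As an illustration of this analysis plan, the sketch below shows the reverse-coding of (r) items, an exploratory factor analysis, and a multiple regression in Python. The file name, column names, and number of factors are hypothetical placeholders; only the steps themselves (reverse-code, factor-analyze, regress) come from the note above.

    import pandas as pd
    import statsmodels.api as sm
    from sklearn.decomposition import FactorAnalysis

    # Hypothetical file of 1-5 Likert responses, one column per item.
    df = pd.read_csv("srs_survey.csv")

    # Reverse-code the (r) items so all items point the same direction:
    # on a 5-point scale, 5 becomes 1, 4 becomes 2, and so on.
    reverse_items = ["felt_outsider", "course_alienating", "class_boring"]
    df[reverse_items] = 6 - df[reverse_items]

    # Exploratory factor analysis to check that items load on their
    # intended variables (classroom interaction, satisfaction, etc.).
    items = df.filter(regex="^(interact|satisf|learn)_").dropna()
    fa = FactorAnalysis(n_components=4).fit(items)
    print(pd.DataFrame(fa.components_, columns=items.columns).round(2))

    # Multiple regression of perceived learning on other scale scores.
    y = df["perceived_learning"]
    X = sm.add_constant(df[["classroom_interaction", "satisfaction_tech"]])
    print(sm.OLS(y, X, missing="drop").fit().summary())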

The following statements address different aspects of your experience in taking a course that uses a student response system, or clickers. For each statement, please enter a number from 5 if you strongly agree with the statement to 1 if you strongly disagree with the statement.

(5) Strongly Agree | (4) Agree | (3) Neutral | (2) Disagree | (1) Strongly Disagree

Note: The classroom interaction and student satisfaction with course instruments would be given to both classes, using and not using SRS, along with other course documents (exam grades, overall grade averages, etc.) for a parallel comparison.

Classroom Interaction

(Communication, Participation, Engagement)

I felt involved in this course
I was engaged
I felt like an outsider (r)
I felt a part of the class
This course was alienating (r)
Using clickers helps me to pay attention in class
The classroom environment was very lively and active
This class was boring (r)
Communication with my classmates was good
Communication with my instructor was effective in class
I wish there were more opportunities to participate with my classmates (r)
I didn’t communicate very well with my instructor (r)
Amount of communication in the classroom was adequate
I would have liked to see more interactions in the class (r)
There was little misunderstanding
There seemed to be confusion a lot of the time (r)
My participation in the course was frequent
I didn’t participate much in class (r)
My presence in this class makes no difference (r)
Students daydream, write letters, or read the newspaper during class (r)

Student Satisfaction with Course
(Instructor, Course Design, Course Management, Grades, Feedback, Subject, Peers)

Instructor

The instructor motivated active interaction
The instructor was not enthusiastic about the class (r)
The instructor facilitated authentic collaboration
The instructor failed to create a socially welcoming environment (r)
The instructor built rapport with us
I don’t feel like I had a good working relationship with my instructor (r)
The instructor stimulated my interest
My interests were never piqued in the class (r)
The instructor was available when I had questions
Many of my questions have gone unanswered (r)
The instructor was likeable
I don’t care for the instructor (r)
I feel the instructor is knowledgeable
The instructor didn’t know what they were talking about (r)
Overall, my instructor was good.
I was able to easily communicate with my instructor.
Communication with my instructor was difficult (r)

Course Design

The class activities were carefully structured
The activities in class were unorganized (r)
The course objectives were clear
I didn’t understand the goals of the class (r)
The class was well organized
The class was chaotic (r)
The information in class was easy to manage
The content of this course was well arranged and logically presented
I couldn’t manage all the information I was receiving in class (r)
The instructor provided clear guidelines as to what needed to be done
Class guidelines did not help maintain order (r)
The instructor was not specific about deadlines (r)
The requirements for the course were clearly outlined
Class expectations were not clearly spelled out (r)
I didn’t know what was required to complete the course (r)
Course concepts were taught in the context of real life experiences
There was no opportunity to share my real life experiences as they relate to the course material (r)
I was able to place abstract concepts in my own frame of reference
I could not relate to abstract concepts (r)
Other students’ descriptions of concepts helped me understand the course material
Assignments were aligned with course objectives
Exams and quizzes were aligned with course objectives
The guidelines for evaluation were unclear (r)
Assignments did not allow me to demonstrate my learning (r)

Course Management

The instructor was prepared each day
The instructor never seemed prepared to teach (r)
The organization of the lessons was logical and easy to follow.
The presentations used in class needed to be better designed (r)
There were sufficient examples and non-examples to clarify instruction.
I felt that class time was wasted (r)
Class time was productive
The lectures were useful
Lectures were a waste of my time (r)
There was plenty of time to ask questions in class
I never had time to have my questions answered in class (r)
Help was available when I needed clarification of course materials.

Grades

I understood how to do well in the class
I didn’t know how to succeed in this class (r)
My grade reflected the quality and timeliness of my contribution
My grade reflected the quantity (number or size) of my contribution (r)
Grading was fair
It was very clear what students needed to do in order to make good grades in this class

Feedback

Instructor responded to my e-mails in a timely manner
Feedback from my instructor was clear and effective
The feedback I received lacked quality (r)
I need to receive feedback more frequently (r)
Instructor provided timely feedback
I did not receive constructive feedback on my assignments.

Subject

I liked this subject area
I would prefer not to take a class in this subject area in the future (r)
The course content was interesting

Peers

I enjoyed interactions with my peers
I didn’t have very good interactions with my peers (r)
There were people in the class with whom I would like to be friends
Relationships established among students in this class carry over outside the classroom
Students in this class have not gotten to know each other very well (r)

Satisfaction with Technology
(Acquisition, Support, Use, Training, Fun, Validity, Overall)

Acquiring a Clicker (Cost, Purchasing, and Registration)

The clicker was too expensive (r)
I feel the cost was worth it
It was easy to purchase my clicker at the bookstore
I didn’t understand how to get a clicker (r)
Registering my clicker was a simple process
I was a little lost when it came to registering my clicker (r)

Support

The Help Desk was able to answer any questions I had about using my clicker
The Help Desk was not able to solve my registration issue (r)
The Help Desk was able to resolve my problem with registering my clicker
The Help Desk did not answer my question about using the clicker (r)
My instructor was able to answer my questions about my clicker
My instructor did not have answers to my questions about my clicker (r)
Technical support was available when I needed it
I needed technical support with my clicker (r)

Ease of Use

I had no problems with the student response system
Using the clickers was easy
Using the clickers is hard (r)
I didn’t really understand how the response system worked (r)
Programming my clicker was confusing (r)
I was able to program my clicker with no problems
Using my clicker in class was simple
I sometimes had difficulty using my clicker in class (r)

Training

The introductory explanations on how to use the technology were sufficient
The introductory explanations on how to use the clickers were clear
After the introduction, I still wasn’t sure how to use this technology (r)
Listening to the introductory explanations, I understood quickly how this technology works.

Fun




Using the clickers was fun
I always enjoyed using the new technology
Using clickers as a way of interacting in class is exciting
The technology was annoying (r)
I really enjoyed using the clickers.
I enjoyed this course
This course was a waste of my time (r)

Validity

Having to use the response system influenced the way I answered questions (r)
I never thought about the technology while answering questions.
I’m not sure my answers were accurate. (r)
Having to use clickers had no effect on how I answered the questions.
I wasn’t always able to answer the way I wanted because of the technology. (r)

Overall Effectiveness

Overall, I am happy with using clickers
Using clickers in a course is a waste of time (r)
I would take another course that used a response system
I will avoid classes using clickers in the future (r)
I would recommend that the instructor continue to use a response system
I would hope the instructor would think twice about using clickers in the future (r)
I would recommend this course to others
I would not recommend this course to a classmate (r)
The technology used in this course motivated me to learn
Compared with other courses, this course was excellent

Perceived Learning

Clickers have been beneficial to my learning
Clickers had no impact on my learning (r)
Clickers allow me to connect concepts to one another
The technology did not allow me to understand concepts better (r)
Clickers allow me to apply concepts to real world situations
I don’t feel that what I learned using clickers is practical (r)
Clickers made it difficult for me to figure out what I didn’t know (r)
The technology helped me to focus my studying
Clickers didn’t help me figure out what I needed to study for (r)
The new technology really helped me better prepare for exams
The clickers were useless in preparing for exams (r)
Using the clickers reinforced my reading
The technology had nothing to do with my homework (r)
The new technology had no relation to my reading (r)
The student response systems helped emphasize what I learned in my homework
Clickers helped me understand the course material
Clickers helped me assess my knowledge
Clickers did not help me assess the knowledge I had learned (r)
The clickers made it easy to connect course materials

Perceived Performance

Using the clickers helped me get a better grade in this class
The new technology made it difficult for me to get a good grade in this class (r)
The clickers help me do better on my quizzes and/or exams
Using the clickers did not help me score higher on the quizzes and/or exams (r)
I got higher scores on my assignments and homework because of the new technology
Using clickers did not improve my homework grades (r)



PART II: Open-ended Questions

Placed at the end of the survey and to be coded for thematic analysis

What did you like best about your experience using clickers in this class?

What were some barriers to your use of clickers in this class?

What is one thing you would improve about your experience?

Would you recommend this course to others, if it used the same technology? Why or Why not?



Part III: Other Historical Data to Gather from Faculty to Assess Student Impact

Student performance variables:

(to be collected from current and previous semesters, if available)

  1. Drop/Retention Rate (Number of students as of 9.18 versus as of 12.28; see the sketch after this list)
  2. Pass Rate
  3. Individual Exam Average
  4. Overall Exam Average
  5. Overall Homework Average
  6. Overall Grade Average
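For the drop/retention rate in item 1, the two head counts translate into rates as in the short sketch below; the enrollment numbers here are hypothetical.

    # Head counts on the two census dates named in item 1.
    enrolled_sept_18 = 250   # students enrolled as of 9.18
    enrolled_dec_28 = 230    # students still enrolled as of 12.28

    drop_rate = (enrolled_sept_18 - enrolled_dec_28) / enrolled_sept_18
    retention_rate = enrolled_dec_28 / enrolled_sept_18
    print(f"drop rate: {drop_rate:.1%}; retention rate: {retention_rate:.1%}")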

Satisfaction variable:

  1. Average student evaluations (standard department or university teaching evaluations from current course and from previous semester, if taught the course before)

Please send the numbers in points and percentages, if available. Feel free to contact tjoosten@uwm.edu with any questions regarding gathering this data.

Due: January 15th

Abstract for EXEMPT Protocol

Student Response Systems (SRS): Exploring Potential and Assessing Impact

The purpose of this study is to evaluate the impact of student response systems (SRS) on teaching and learning. Student response systems, more commonly called clickers, are wireless communication systems that allow faculty to poll students, who respond using a clicker or response pad. In our study, we will utilize both quantitative (multiple regression) and qualitative (thematic analysis) methodologies in evaluating the project outcomes.
In this study, data will be collected from approximately 2500 students and 11 faculty members. The 11 faculty members have integrated student response systems into their courses for the fall 2005 semester. Approximately 2500 students are enrolled in the courses of the 11 faculty members; therefore, students participating in the study will have engaged in a course using the student response system technology. Students who participate in this study will be provided with an information sheet about the study and a questionnaire to complete. They can withdraw from this study for any reason at any time. Any data collected from a withdrawing student will be destroyed and not used in the analysis. Filling out the questionnaire will take approximately 15 minutes. All data will be treated confidentially. No raw data will be shared. Grouped or "aggregate" data may be shared in class or published in professional journals.

Part I of the study will focus on the instructors’ experiences. First, we will be organizing various focus groups in order to gather faculty reports on the SRS for analysis. A thematic analysis of narratives will be conducted using the focus group data, individual comments and stories gathered during one-on-one training appointments, and email correspondence and queries during the project. For selected faculty participants, we will conduct a case study focusing on two courses conducted in parallel, with one lecture utilizing a student response system and the other offered in a traditional setting without SRS. All instructors will remain anonymous in any written reports or publications that result from the study. In addition, we will administer a faculty survey containing a series of questions intended to measure instructors’ attitudes toward the SRS. The data collected will be anonymous. Surveys will be sent to faculty through electronic mail and returned to me using electronic mail or campus mail. No identifying information is required on the faculty surveys.
Part II of this study focuses on student variables. We will gather anonymous student evaluations by survey, which will contain a series of questions that ask about students' previous technology use, attitudes toward the use of SRS in the classroom, their evaluation of the clicker technology itself, their perception of their learning in relation to SRS, and their performance in the course in relation to SRS. Questionnaires will be distributed to 2500 students who have participated in a course taught by one of the 11 faculty members who are using student response systems in their classrooms. Each faculty member will be given a packet with the number of questionnaires corresponding to the number of students he or she has taught and asked to distribute the questionnaires to the students. Each instructor’s packet will be coded with a letter so that questionnaires from the same classroom can be grouped for analysis. Surveys will be collected by the faculty member and sent via intercampus mail to the researcher at the Learning Technology Center. We will conduct various multivariate analyses of the data to produce quantifiable results of the effectiveness of the technology.
In addition to the data gathered through questionnaires, faculty will gather historical data, including performance data, average grades, and pass rates, from the semester courses being studied as well as from previous semesters without SRS in order to run comparisons, as well as gather learning assessments in class. Also, faculty will gather student evaluations from current semesters and from previous semesters prior to SRS to run comparisons.

All survey data will be kept in a secure desk drawer in my office until the data is entered into a database. All surveys will be destroyed at the completion of the study.

This study is considered exempt under Category 1 because it involves research conducted in a traditional education setting involving normal education practices and the effectiveness of instructional techniques. Also, this study is exempt under Category 2 because it involves the use of survey procedures and interview procedures. Finally, this study is exempt because it involves research in which we will collect or study existing data, documents, and records.


Information Sheet
Student Response Systems


I am Tanya Joosten of the Learning Technology Center at the University of Wisconsin-Milwaukee. We are conducting a study of how student response systems (a.k.a. clickers) impact teaching and student learning. We would appreciate your participation in this study, as it will assist us in making recommendations for improving the teaching of students throughout the UWM campus.

If you agree to be in this study, you will be asked to fill out a survey and to answer several questions about your experience in using clickers. The survey should take 15 minutes to complete. There are over 2000 students at UWM participating in the study. This study will take place between September 2005 and January 2006. There are no known risks associated with your being in the study. Possible benefits are that you will have a voice in helping shape the instruction that students on the UWM campus receive.

The questionnaire you fill out will be treated confidentially. PLEASE DO NOT PUT YOUR NAME ANYWHERE ON THE SURVEY. There will be no way to link you to your responses. Data from this study will be shared with my classmates and teacher and may be published in professional journals. Only grouped data will be presented or published.

You do not have to be in the study. You can withdraw from this study while filling out the questionnaire for any reason. There is no penalty for withdrawing.

Once the study is completed, we would be glad to give the results to you. In the meantime, if you have any questions, please contact me:

Tanya Joosten

Learning Technology Center
University of Wisconsin-Milwaukee
PO Box 604
Milwaukee, WI 53201

tjoosten@uwm.edu

414.229.4319

If you have any complaints about your experience as a participant in this study, please call or write:

Chris Buth Furness
IRB Administrator
Institutional Review Board for the
Protection of Human Subjects
The Graduate School
University of Wisconsin-Milwaukee
PO Box 340, MIT 206
Milwaukee, WI 53201
414-229-3173 phone; 414-229-5000 fax

chrisb@uwm.edu

Although Ms. Furness will ask your name, all complaints are kept in confidence.

This research project has been approved by the University of Wisconsin-Milwaukee Institutional Review Board for the Protection of Human Subjects for a one-year period.



STUDENT SURVEY
Filling out this survey indicates that I am at least 18 years old and that I am giving my informed consent to be a subject in this study.

1. Course Number, Course Name, Date, Time and Location:

2. Student ID

Demographics

Please indicate with an “X” which of the following best describes you.

3. Age:
17 - 19 _
20 - 22 _
23 - 25 _
25+ _

4. Gender:
Male _
Female _

5. Race:
African American _
Latino _
Asian _
European American/Caucasian _
Other: Please identify

6. Student Status:
Full-time _
Part-time _

7. Education Level:
Freshman _
Sophomore _
Junior _
Senior _

8. Have you previously taken a course that used clickers? Yes (1) No (2)

9. Did you use a clicker in this course? Yes (1) No (2)

If yes to #9, please answer number 10.

If no to #9, please move on to the next section.

10. Which clicker system did you use?

Turning Point (1) CPS/eInstruction (2) Other (3)


Computer Use

Please rate your use of computers or computer software. There are no correct answers or preferred answers. Each item is rated on a 5-point scale ranging from frequently (5) to never (1).

(5) Frequently | (4) Often | (3) Sometimes | (2) Rarely | (1) Never




11. _ How often do you use a computer at home?

12. _ How often do you use a computer at school?

13. _ How often do you use a computer at work?

14. _ How often do you use a computer in a library?

15. _ How often do you use a computer at a friend or family member’s house?

16. _ How often do you use a computer?

17. _ How often do you use the Internet?

18. _ How often do you send or receive electronic mail?

19. _ How often do you use chat software/instant messenger (AOL, MSN, ICQ, etc.)?

20. _ How often do you use a word processor (Word, WordPerfect)?

21. _ How often do you use spreadsheets (Excel, Lotus)?

22. _ How often do you use the computer for games?

23. _ How often do you use the computer to view or edit graphics/photo images?

24. _ How often do you use a courseware product (e.g., Desire2Learn, D2L)?

Student Response System Grant Evaluation: Student Survey Items

Note: The following items will be randomized in the survey. The reverse-coded items are noted by (r) and are required to increase the reliability of this instrument. The headings in bold are specific to the variable they are measuring. Some of the survey items are taken from surveys with proven reliability; the rest are grounded in the research examined. Once the survey is administered, I will run a factor analysis to make sure the items factor to their corresponding variable. Multiple regression will be run to analyze relationships between variables.


Sample instructions and Likert scale:

The following statements address different aspects of your experience in taking a course that uses a student response system, or clickers. For each statement, please enter a number from 5 if you strongly agree with the statement to 1 if you strongly disagree with the statement.

(5) Strongly Agree | (4) Agree | (3) Neutral | (2) Disagree | (1) Strongly Disagree
Classroom Interaction

(Communication, Participation, Engagement)

Option 1: Insert a statement advising them to compare to another course.
Read Carefully. Please review each of the following statements. Each statement reflects a comparison to the classes you have taken that did not use clickers.

I felt involved in this course

I was engaged

I felt like an outsider (r)

I felt a part of the class

This course was alienating (r)

Using clickers helps me to pay attention in class

The classroom environment was very lively and active

This class was boring (r)

Communication with my classmates was good

Communication with my instructor was effective in class

I wish there were more opportunities to participate with my classmates (r)

I didn’t communicate very well with my instructor (r)

Amount of communication in the classroom was adequate

I would have liked to see more interactions in the class (r)

There was little misunderstanding

There seemed to be confusion a lot of the time (r)

My participation in the course was frequent

I didn’t participate much in class (r)

My presence in this class makes no difference (r)

Students day dream, write letters, or read the newspaper during class (r)

Option 2: Rewrite of items, specifically addressing the clickers.

Clickers made me feel involved in the course

Clickers led me to be engaged in class

Clickers made me feel like an outsider (r)

I felt a part of the class because of using my clicker

Using clickers helps me to pay attention in class

The classroom environment was very lively and active thanks to the clickers

I would have liked to see more interactions in the class (r)

Clickers facilitated good communication with my classmates

The clickers did not provide an opportunity for me to participate with my classmates (r)

Clickers made communication with my instructor more effective in class

The clickers made it difficult to communicate very well with my instructor (r)

Clickers decreased misunderstandings about course material

Clickers seemed to cause confusion about the material a lot of the time (r)

Clickers increased the frequency of my participation in the course

The clickers did not impact my participation much in class (r)
Satisfaction with Technology

(Acquisition, Support, Use, Training, Fun, Validity, Overall)

Acquiring a Clicker (Cost, Purchasing, and Registration)

The clicker was too expensive for its purpose (r)

I feel the cost was worth it

It was easy to purchase my clicker at the bookstore

I didn’t understand how to get a clicker (r)

Registering my clicker was a simple process

I was a little lost when it came to registering my clicker (r)

Support

The Help Desk was able to answer any questions I had about using my clicker

The Help Desk did not answer my question about using the clicker (r)

My instructor/TA was able to answer my questions about my clicker

My instructor/TA did not have answers to my questions about my clicker (r)

Technical support was available when I needed it

I needed better technical support with my clicker (r)

Ease of Use

I had no problems using the clickers

Programming my clicker was confusing (r)

Programming my clicker was easy [Note: too similar to the reverse item; what about “It was easy to program my clicker”?]

Using my clicker in class was simple

I sometimes had difficulty using my clicker in class (r)

Training

The introductory explanations on how to use the technology were sufficient

The introductory explanations on how to use the clickers were clear

After the introduction, I still wasn’t sure how to use this technology (r)


Fun


Using the clickers was fun

Using clickers as a way of interacting in class is exciting

The technology was annoying (r)

I really enjoyed using the clickers

Validity

I was never certain that my clicker response was received by the system (r)

Having to use clickers had no effect on how I answered the questions

I never thought about the technology while answering questions

I lost credit in class because my clicker response didn’t register with the system (r)


Additional statements Alan had:

I didn’t feel like 1 or 3 were validity statements. We might want to insert them somewhere else (e.g., is #3 a classroom interaction issue?)

Overall Effectiveness

Overall, I am happy with using clickers

Using clickers in a course is a waste of time (r)

I would take another course that used a response system

I will avoid classes using clickers in the future (r)

I would recommend that the instructor continue to use a response system

I would hope the instructor would think twice about using clickers in the future (r)

I would recommend this course to others

I would not recommend this course to a classmate (r)

The technology used in this course motivated me to learn
Perceived Learning

Clickers have been beneficial to my learning

Clickers had little impact on my learning (r)

Clickers allow me to better understand concepts

The technology did not allow me to understand concepts better (r)

The new technology really helped me better prepare for exams

The clickers were not much help in preparing for exams (r)

Clickers helped me understand the course material

Clickers helped me get instant feedback on what I knew and didn’t know

Clickers failed to help me identify what I still needed to learn (r)

The clickers made it easy to connect ideas together

Using clickers helped me think more deeply about course materials
Perceived Performance
Using the clickers helped me get a better grade in this class
The new technology made it difficult for me to get a good grade in this class (r)
The clickers helped me do better on my exams
Using the clickers did not help me score higher on the exams (r)
I got higher scores on my assignments because of the new technology
Using clickers did not improve my assignment grades (r)
PART II: Open-ended Questions
Placed at the end of the survey and to be coded for thematic analysis
What did you like best about your experience using clickers in this class?
What were some barriers to your use of clickers in this class?
What is one thing you would improve about your experience?
Would you recommend this course to others, if it used the same technology? Why or Why not?
Part III: Other Historical Data to Gather from Faculty to Assess Student Impact
Student performance variables:


(to be collected from current and previous semesters, if available)
1. Drop/Retention Rate (Number of students as of 9.18 versus as of 12.28)
2. Pass Rate
3. Individual Exam Average
4. Overall Exam Average
5. Overall Homework Average
6. Overall Grade Average
Satisfaction variable:
1. Average student evaluations (standard department or university teaching evaluations from current course and from previous semester, if taught the course before)
Please send the numbers in points and percentages, if available. Feel free to contact tjoosten@uwm.edu with any questions regarding gathering this data.

Due: January 15th

We need to discuss delivery methods for the survey: implications of delivery via SRS, web-based, D2L, and paper-based.

See: Rice, R. E., & Bunz, U. Evaluating a Wireless Course Feedback System: The Role of Demographics, Expertise, Fluency, Competency, and Usage. Department of Communication, School of Communication, Information & Library Science, Rutgers University, 4 Huntington St., New Brunswick, NJ 08901-1071. rrice@scils.rutgers.edu; bunz@scils.rutgers.edu

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.84.7753&rep=rep1&type=pdf